Path: blob/master/Part 3 - Classification/Logistic Regression/[Python] Logistic Regression.ipynb
Kernel: Python 3
Logistic Regression
Data preprocessing
In [5]:
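The first cell likely holds the standard imports for this workflow; a minimal sketch:

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd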
In [6]:
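This cell presumably loads the dataset. The values shown further down match the classic Social_Network_Ads data (Age, EstimatedSalary, Purchased), though the file name here is an assumption:

dataset = pd.read_csv('Social_Network_Ads.csv')  # assumed file name
dataset.head()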
Out[6]:
In [7]:
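A sketch of the feature extraction; the column positions are assumptions consistent with the Social_Network_Ads layout (Age in column 2, EstimatedSalary in column 3), and the first ten rows are previewed to produce the output below:

X = dataset.iloc[:, [2, 3]].values  # Age, EstimatedSalary (assumed positions)
X[:10]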
Out[7]:
array([[ 19, 19000],
[ 35, 20000],
[ 26, 43000],
[ 27, 57000],
[ 19, 76000],
[ 27, 58000],
[ 27, 84000],
[ 32, 150000],
[ 25, 33000],
[ 35, 65000]])
In [8]:
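Likewise for the target vector, assuming Purchased sits in column 4:

y = dataset.iloc[:, 4].values  # Purchased (assumed position)
y[:10]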
Out[8]:
array([0, 0, 0, 0, 0, 0, 0, 1, 0, 0])
In [9]:
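A sketch of the train/test split. The confusion matrix later sums to 80 test observations, which for the 400-row Social_Network_Ads data implies test_size=0.2; the random_state is an assumption:

from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)  # random_state is an assumption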
In [11]:
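Feature scaling, fit on the training set only so the test set sees the same transformation; a standard sketch:

from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X_train = sc.fit_transform(X_train)  # learn mean and std from the training set
X_test = sc.transform(X_test)        # reuse them on the test set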
In [12]:
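A preview of the first ten scaled training rows, matching the output below:

X_train[:10]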
Out[12]:
array([[-1.06675246, -0.38634438],
[ 0.79753468, -1.22993871],
[ 0.11069205, 1.853544 ],
[ 0.60129393, -0.90995465],
[ 1.87685881, -1.28811763],
[-0.57615058, 1.44629156],
[ 0.3069328 , -0.53179168],
[ 0.99377543, 0.10817643],
[-1.16487283, 0.45724994],
[-1.55735433, 0.31180264]])
Fitting Logistic Regression to the Training Set
In [13]:
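The repr below pins the model down: default parameters with random_state=42 (the solver='liblinear' and multi_class='ovr' shown are simply the defaults of the scikit-learn version used here). A sketch:

from sklearn.linear_model import LogisticRegression

classifier = LogisticRegression(random_state=42)
classifier.fit(X_train, y_train)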
Out[13]:
LogisticRegression(C=1.0, class_weight=None, dual=False, fit_intercept=True,
intercept_scaling=1, max_iter=100, multi_class='ovr', n_jobs=1,
penalty='l2', random_state=42, solver='liblinear', tol=0.0001,
verbose=0, warm_start=False)
Predicting the Test set results
In [14]:
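Predicting the test set; a sketch:

y_pred = classifier.predict(X_test)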
In [15]:
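The next two outputs look like a side-by-side preview of predicted and actual labels, presumably along these lines:

y_pred[:15]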
Out[15]:
array([0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0])
In [16]:
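And the corresponding actual labels, for comparison:

y_test[:15]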
Out[16]:
array([0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0])
The predicted values match the actual test set labels across this fifteen-observation preview.
Making the Confusion Matrix
In [17]:
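A sketch of the confusion matrix computation:

from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_test, y_pred)  # rows: actual class, columns: predicted class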
In [18]:
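The matrix itself:

cm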
Out[18]:
array([[50, 2],
[ 8, 20]])
The classifier made 50 + 20 = 70 correct predictions and 8 + 2 = 10 incorrect predictions, for an accuracy of 70/80 = 87.5% on the test set.
Visualizing the training set results
In [19]:
In [37]:
In [38]:
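These cells presumably import ListedColormap and build a dense grid over the scaled feature space on which to evaluate the classifier; a sketch, with the variable names being assumptions:

from matplotlib.colors import ListedColormap

X_set, y_set = X_train, y_train
# dense grid covering the (scaled) feature space, step 0.01
X1, X2 = np.meshgrid(
    np.arange(X_set[:, 0].min() - 1, X_set[:, 0].max() + 1, 0.01),
    np.arange(X_set[:, 1].min() - 1, X_set[:, 1].max() + 1, 0.01))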
In [39]:
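The QuadContourSet output below is consistent with a filled-contour plot of the decision regions, obtained by predicting every grid point:

plt.contourf(X1, X2,
             classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))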
Out[39]:
<matplotlib.contour.QuadContourSet at 0x7f08769bdbe0>
In [40]:
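This cell presumably scatters the training points, colored by class (its rendered figure did not survive); a sketch:

for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)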
Out[40]:
Merging the above plots and labeling the axes
In [41]:
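A sketch of the merged plot. The title and axis labels are assumptions consistent with the Age and EstimatedSalary features, and plt.legend() as the last expression accounts for the Legend repr below:

plt.contourf(X1, X2,
             classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)
plt.title('Logistic Regression (Training set)')  # assumed title
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()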
Out[41]:
<matplotlib.legend.Legend at 0x7f0876970cc0>
Visualizing the test set results
In [42]:
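The test set version presumably follows the same recipe, swapping in X_test and y_test; a sketch:

X_set, y_set = X_test, y_test
X1, X2 = np.meshgrid(
    np.arange(X_set[:, 0].min() - 1, X_set[:, 0].max() + 1, 0.01),
    np.arange(X_set[:, 1].min() - 1, X_set[:, 1].max() + 1, 0.01))
plt.contourf(X1, X2,
             classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)
plt.title('Logistic Regression (Test set)')  # assumed title
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()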
Out[42]:
<matplotlib.legend.Legend at 0x7f08769b4518>